A Computational Test of the Information-Theory Based Entropy Theory of Perception: Does It Actually Generate the Stevens and Weber-Fechner Laws of Sensation?

Author

  • Lance Nizami
Abstract

K.H. Norwich et al. used Shannon Information Theory to derive their Entropy Theory of Perception (1975-present). The Entropy Theory produces the Entropy Equation, which relates the strength of sensation (represented by magnitude estimates) to the intensity of the sensory stimulus. At “high” intensities, the relation is approximately logarithmic, which Norwich et al. dubbed “the Weber-Fechner Law”. At “low” intensities, the relation is approximately a power function, dubbed “Stevens’ Law”. Unfortunately, the Entropy Equation has three unknowns, so that what constitutes “high” and “low” can only be established through curve-fitting. Remarkably, that curve-fitting was never done. Establishing parameter values is especially important because one of the unknowns is a power exponent (the “Entropy Exponent”, here denoted y) said to be identical in value to “Stevens’ exponent” (here denoted x). The identity y=x was crucial to the numerous published applications of the Entropy Theory to psychophysical and neurophysiological phenomena. Curve-fitting of the Entropy Equation to magnitude estimates would therefore establish the ranges of the “Weber-Fechner” and “Stevens” laws and reveal whether y=x. The present author did the curve-fitting, following the custom in the literature: logarithmic forms of the Entropy Equation and Stevens’ Law were fitted by least-squares regression to logarithm(magnitude-estimate) vs. logarithm(stimulus-strength) taken from 64 published curves of magnitude estimates. The resulting relation of y to x was broadly scattered; 62 of 64 times, y exceeded x. In theory, the fitted Entropy Equation allows calculation of the information transmitted in perception. Hence the regressions were re-run under the constraint that the information transmitted equals 2.5 bits/stimulus, the mean value in the literature. Under the constrained regression, y≈1.7x. Altogether, the purported equality of the Entropy Exponent and Stevens’ exponent was not confirmed.
Further, neither the “Weber-Fechner Law” nor “Stevens’ Law” derived from any fitted Entropy Equation described the entire range of the respective magnitude-estimation curve, contrary to the formal use of those laws. Norwich’s later quantification of sensation growth by “physical entropy” makes identical mistakes. All of this emphasizes that the Entropy Theory does not derive rules of sensory perception from information theory, and it is recommended that further attempts to do so be discouraged.

Manuscript received March 25, 2009. Work supported by the author. L. Nizami is presently an Independent Research Scholar in Decatur, GA (404-299-5530; e-mail: [email protected]). Research commenced at Dept. of Psychology, University of Toronto in Mississauga, 3359 Mississauga Rd. N., Mississauga, ON, Canada, and continued at Center for Hearing Research, Boys Town National Research Hospital, Omaha, NE 68131, USA.
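The abstract's fitting procedure can be sketched numerically. A minimal illustration, not the author's code: Norwich's entropy equation is commonly written as F = (k/2)·ln(1 + g·I^y), which behaves as a power function of exponent y at low intensities and as a logarithm at high intensities. The parameter values and stimulus range below are illustrative assumptions; the sketch generates "magnitude estimates" from the equation and then fits the logarithmic form of Stevens' Law, log F = log a + x·log I, by least-squares regression, so the fitted Stevens exponent x can be compared with the Entropy Exponent y.

```python
# Sketch only (assumed parameters, synthetic data, not the paper's datasets):
# fit the log form of Stevens' Law to magnitude estimates generated from
# Norwich's entropy equation F = (k/2) * ln(1 + g * I**y).
import numpy as np

k, g, y = 2.0, 0.5, 0.6                 # assumed entropy-equation parameters
I = np.logspace(-1, 3, 50)              # stimulus intensities (arbitrary units)
F = 0.5 * k * np.log(1.0 + g * I**y)    # "magnitude estimates" from the theory

# Least-squares regression of log(F) on log(I), the custom in the literature
logI, logF = np.log10(I), np.log10(F)
x_stevens, log_a = np.polyfit(logI, logF, 1)

print(f"Entropy Exponent y = {y}, fitted Stevens exponent x = {x_stevens:.3f}")
```

Because the entropy curve flattens toward a logarithm at high intensities, a single power-law slope fitted over the whole range falls below y, illustrating why the relation between the two exponents must be established empirically rather than assumed.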


Similar articles

A multi agent method for cell formation with uncertain situation, based on information theory

This paper treats the cell formation problem as a distributed decision network. It proposes an approach based on the application and extension of information-theory concepts, in order to analyze informational complexity in an agent-based system due to interdependence between agents. Based on this approach, new quantitative concepts and definitions are proposed in order to measure the amount of t...


On the Misapplication of Cybernetics to Sensory Neurons: Norwich’s Informational Entropy Theory of Perception Has Not Derived Stevens’ Law for Taste

Norwich’s Entropy Theory of Perception reveals a startling conclusion: that Stevens’ Law with an Index of 1, a power function stating direct proportionality between perceived taste intensity and stimulus concentration, arises purely from theory. Norwich’s theorizing starts with extraordinary hypotheses. First, “multiple, parallel receptor-neuron units” without collaterals “carry essentially the...


Logarithmic and Power Law Input-Output Relations in Sensory Systems with Fold-Change Detection

Two central biophysical laws describe sensory responses to input signals. One is a logarithmic relationship between input and output, and the other is a power law relationship. These laws are sometimes called the Weber-Fechner law and the Stevens power law, respectively. The two laws are found in a wide variety of human sensory systems including hearing, vision, taste, and weight perception; th...


Evaluation of monitoring network density using discrete entropy theory

The regional evaluation of monitoring stations for water resources can be of great importance due to its role in finding appropriate locations for stations, the maximum gathering of useful information and preventing the accumulation of unnecessary information and ultimately reducing the cost of data collection. Based on the theory of discrete entropy, this study analyzes the density of rain gag...


Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information

Information theory, a branch of mathematics, is used in genetic and bioinformatic analyses and can support many analyses of biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing, and structure-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...



Publication date: 2009